Sparse additive regression on a regular lattice

Authors

  • Felix Abramovich
  • Tal Lahav
Abstract

We consider estimation in a sparse additive regression model with design points on a regular lattice. We establish the minimax convergence rates over Sobolev classes and propose a Fourier-based, rate-optimal estimator that is adaptive to the unknown sparsity and smoothness of the response function. The estimator is derived within a Bayesian formalism but can be naturally viewed as a penalized maximum likelihood estimator with complexity penalties on the number of non-zero univariate additive components of the response and on the numbers of non-zero coefficients of their Fourier expansions. We compare it with several existing counterparts and perform a short simulation study to demonstrate its performance.
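The abstract's recipe — expand each univariate component in a Fourier basis, keep only significant coefficients, and keep only components with enough signal — can be caricatured in a few lines. This is an illustrative sketch only, not the authors' estimator: the threshold values, the hard-thresholding rule, and the component-energy cutoff are all assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128                          # grid points on a regular lattice
t = np.arange(n) / n

# Sparse additive truth: only 2 of 5 univariate components are non-zero.
f1 = np.sin(2 * np.pi * t)
f2 = 0.5 * np.cos(6 * np.pi * t)
components = [f1, f2, np.zeros(n), np.zeros(n), np.zeros(n)]

sigma = 0.3
observed = [f + sigma * rng.standard_normal(n) for f in components]

def fit_component(y, coef_thresh):
    """Hard-threshold the (normalized) DFT coefficients of one component."""
    c = np.fft.rfft(y) / len(y)
    c[np.abs(c) < coef_thresh] = 0.0
    fhat = np.fft.irfft(c * len(y), n=len(y))
    return fhat, c

selected = []
for j, y in enumerate(observed):
    fhat, c = fit_component(y, coef_thresh=0.15)
    # Component-level penalty: drop components whose retained Fourier
    # energy is negligible (a stand-in for the complexity penalty).
    if np.sum(np.abs(c) ** 2) > 1e-3:
        selected.append(j)

print(selected)   # the two truly non-zero components: [0, 1]
```

With the thresholds above, the noise coefficients (magnitude on the order of sigma/sqrt(n)) fall well below the cutoff while the two true signal coefficients survive, so only components 0 and 1 are retained.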


Similar articles

Nonparametric Greedy Algorithms for the Sparse Learning Problem

This paper studies the forward greedy strategy in sparse nonparametric regression. For additive models, we propose an algorithm called additive forward regression; for general multivariate models, we propose an algorithm called generalized forward regression. Both algorithms simultaneously conduct estimation and variable selection in nonparametric settings for the high dimensional sparse learni...


Nonparametric regression and classification with joint sparsity constraints

We propose new families of models and algorithms for high-dimensional nonparametric learning with joint sparsity constraints. Our approach is based on a regularization method that enforces common sparsity patterns across different function components in a nonparametric additive model. The algorithms employ a coordinate descent approach that is based on a functional soft-thresholding operator. T...
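The functional soft-thresholding operator mentioned in this summary can be sketched as a group-lasso-style shrinkage of a whole fitted component by its empirical norm. The rule below is a hypothetical illustration under that assumption, not the paper's actual operator or code.

```python
import numpy as np

def functional_soft_threshold(f_hat, lam):
    """Shrink a fitted component (sampled on a grid) toward zero.

    If the component's empirical L2 norm is below lam, it is set
    exactly to zero; otherwise it is shrunk proportionally. This is
    the functional analogue of scalar soft thresholding.
    """
    norm = np.sqrt(np.mean(f_hat ** 2))
    if norm <= lam:
        return np.zeros_like(f_hat)       # component zeroed out entirely
    return (1.0 - lam / norm) * f_hat     # proportional shrinkage

t = np.linspace(0.0, 1.0, 100)
f = np.sin(2 * np.pi * t)                 # empirical norm about 0.707
weak = 0.1 * f                            # empirical norm about 0.071

print(np.allclose(functional_soft_threshold(weak, lam=0.2), 0.0))  # True
```

Zeroing whole components at once is what produces sparsity at the level of functions rather than individual coefficients, which is the point of the joint sparsity constraint described above.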


Additive Sparse Grid Fitting

We propose an iterative algorithm for high-dimensional sparse grid regression with penalty. The algorithm is of additive Schwarz type and is thus parallel. We show that the convergence of the algorithm is better than additive Schwarz and examples demonstrate that convergence is between that of additive Schwarz and multiplicative Schwarz procedures. Similarly, the method shows improved performan...


SpAM: Sparse Additive Models

We present a new class of models for high-dimensional nonparametric regression and classification called sparse additive models (SpAM). Our methods combine ideas from sparse linear modeling and additive nonparametric regression. We derive a method for fitting the models that is effective even when the number of covariates is larger than the sample size. A statistical analysis of the properties ...


Comparing the Bidirectional Baum-Welch Algorithm and the Baum-Welch Algorithm on Regular Lattice

A profile hidden Markov model (PHMM) is widely used in assigning protein sequences to protein families. In this model, the hidden states only depend on the previous hidden state and observations are independent given hidden states. In other words, in the PHMM, only the information of the left side of a hidden state is considered. However, it makes sense that considering the information of the b...



Publication date: 2015